We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable), convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing a convex surrogate of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both pure deterministic and random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results show that on huge-scale problems the proposed hybrid random/deterministic algorithm compares favorably to random and deterministic schemes on both convex and nonconvex problems.
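The scheme described above can be illustrated with a minimal sketch, not the paper's actual algorithm: here blocks are single coordinates, the nonsmooth term is an ℓ1 regularizer, the convex surrogate is a simple quadratic model (so each block update is a proximal step), and the "progress" proxy used for the deterministic selection stage is a hypothetical choice made for illustration.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hybrid_block_step(x, grad_f, lam=0.1, tau=1.0, n_rand=4, n_greedy=2, rng=None):
    """One illustrative iteration of a hybrid random/deterministic update.

    Stage 1 (random): draw a random pool of candidate blocks.
    Stage 2 (deterministic): within the pool, keep the blocks whose
    surrogate minimizer moves the most (a proxy for progress), and
    update only those via a proximal step on the quadratic surrogate
      f(x) + <grad_f, z - x> + (tau/2)||z - x||^2 + lam*||z||_1.
    All names and the selection rule are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    n = x.size  # each coordinate is treated as one block, for simplicity
    pool = rng.choice(n, size=min(n_rand, n), replace=False)  # random stage
    # Candidate surrogate minimizers on the pooled blocks.
    z = prox_l1(x[pool] - grad_f[pool] / tau, lam / tau)
    progress = np.abs(z - x[pool])  # per-block movement as a progress proxy
    keep = pool[np.argsort(progress)[-n_greedy:]]  # deterministic stage
    x_new = x.copy()
    x_new[keep] = prox_l1(x[keep] - grad_f[keep] / tau, lam / tau)
    return x_new
```

In a full method, the updates over the selected blocks would be carried out in parallel and combined with a step-size rule; the sketch only shows the two-stage block-selection idea.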